In probability and statistics, a '''Bernoulli process''' is a finite or infinite sequence of binary random variables, so it is a discrete-time stochastic process that takes only two values, canonically 0 and 1. The component Bernoulli variables ''X''<sub>''i''</sub> are identically distributed and independent. Prosaically, a Bernoulli process is repeated coin flipping, possibly with an unfair coin (but with consistent unfairness). Every variable ''X''<sub>''i''</sub> in the sequence is associated with a Bernoulli trial or experiment. They all have the same Bernoulli distribution. Much of what can be said about the Bernoulli process can also be generalized to more than two outcomes (such as the process for a six-sided die); this generalization is known as the Bernoulli scheme.

The problem of determining the process, given only a limited sample of the Bernoulli trials, may be called the problem of checking whether a coin is fair.

==Definition==
A Bernoulli process is a finite or infinite sequence of independent random variables ''X''<sub>1</sub>, ''X''<sub>2</sub>, ''X''<sub>3</sub>, ..., such that
* for each ''i'', the value of ''X''<sub>''i''</sub> is either 0 or 1;
* for all values of ''i'', the probability ''p'' that ''X''<sub>''i''</sub> = 1 is the same.

In other words, a Bernoulli process is a sequence of independent identically distributed Bernoulli trials.

Independence of the trials implies that the process is memoryless. Given that the probability ''p'' is known, past outcomes provide no information about future outcomes. (If ''p'' is unknown, however, the past informs about the future indirectly, through inferences about ''p''.)

If the process is infinite, then from any point onward the future trials constitute a Bernoulli process identical to the whole process, the fresh-start property.
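The definition can be illustrated with a short simulation. The following is a minimal sketch, assuming Python with NumPy; the parameter ''p'' = 0.3, the sample size, and the split index are arbitrary illustrative choices, not part of the definition itself.

<syntaxhighlight lang="python">
import numpy as np

# Minimal sketch: simulate a Bernoulli process with success probability p.
# p = 0.3 and n = 10_000 are arbitrary illustrative values.
rng = np.random.default_rng(seed=0)
p = 0.3
n = 10_000

# Each X_i is an independent Bernoulli(p) variable taking the values 0 or 1.
X = (rng.random(n) < p).astype(int)

# The empirical frequency of 1s estimates p (law of large numbers).
print("estimate of p from all trials:", X.mean())

# Fresh-start property: the trials after any fixed index form a Bernoulli
# process with the same parameter p, so their empirical frequency also
# estimates p.
print("estimate of p from trials after index 5000:", X[5000:].mean())
</syntaxhighlight>

Both printed estimates should be close to 0.3, reflecting that past outcomes carry no information about future trials when ''p'' is known.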